Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition

Authors

Abstract

In this paper, we introduce a new variant of the BFGS method designed to perform well when gradient measurements are corrupted by noise. We show that treating the secant condition with a penalty approach, motivated by regularized least squares estimation, generates a parametric family of updates that contains the original BFGS update at one extreme and not updating the inverse Hessian approximation at the other extreme. Furthermore, we find that the curvature condition is relaxed as the update moves towards the non-updating extreme, and disappears entirely at the extreme where the inverse Hessian approximation is not updated. These developments allow us to develop a method we refer to as Secant Penalized BFGS (SP-BFGS), which allows one to relax the secant condition based on the amount of noise in the gradient measurements. SP-BFGS provides a means of incrementally updating the inverse Hessian approximation with a controlled amount of bias towards the previous approximation, which replaces the overwriting nature of the original BFGS update with an averaging nature that resists the destructive effects of noise and can cope with negative curvature measurements. We discuss the theoretical properties of SP-BFGS, including convergence when minimizing strongly convex functions in the presence of uniformly bounded noise. Finally, we present extensive numerical experiments using over 30 problems from the CUTEst test problem set that demonstrate the superior performance of SP-BFGS compared to BFGS in the presence of both noisy function and gradient evaluations.
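Since the abstract describes SP-BFGS as interpolating between the standard BFGS inverse-Hessian update and leaving the approximation unchanged, the following minimal Python sketch may help fix ideas. The function `bfgs_inverse_update` implements the classical BFGS formula; `interpolated_update` and its weight `phi` are only an illustrative convex-combination stand-in for the paper's penalty-based interpolation, not the actual SP-BFGS update.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Classical BFGS update of the inverse Hessian approximation H,
    with s = x_{k+1} - x_k and y = g_{k+1} - g_k (requires s^T y > 0)."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def interpolated_update(H, s, y, phi):
    """Illustrative blend between a full BFGS update (phi = 1) and no update
    (phi = 0). This simple convex combination is NOT the SP-BFGS formula;
    it only mimics the idea of biasing toward the previous approximation
    when gradient measurements are noisy."""
    if phi <= 0.0 or y @ s <= 0.0:
        return H.copy()  # skip the update entirely, e.g. under heavy noise
    return (1.0 - phi) * H + phi * bfgs_inverse_update(H, s, y)
```

In the paper's scheme the amount of bias towards the previous approximation is governed by a per-iteration penalty parameter tied to the noise level; the weight `phi` above would have to be chosen accordingly.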


Similar Resources

Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given.
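For orientation, a variational update with multiple weighted secant equations can be posed in the following generic form (an illustrative formulation, not necessarily the one used in this paper), where the pairs (s_i, y_i) are recent step and gradient-difference vectors and the weights w_i > 0 control how strongly each secant equation is enforced:

```latex
\min_{B_{+}=B_{+}^{\top}} \; \|B_{+}-B\|_{F}^{2}
  \;+\; \sum_{i=1}^{m} w_{i}\,\|B_{+}s_{i}-y_{i}\|_{2}^{2},
\qquad s_{i}=x_{k-i+1}-x_{k-i},\quad y_{i}=g_{k-i+1}-g_{k-i}.
```

The stationarity conditions of such a problem are linear in the correction matrix B_+ - B, consistent with the Sylvester equation mentioned in the abstract.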


Newton–Secant method for solving operator equations

Consider the operator equation F(x) = 0 (Eq. (1)), where F is a Fréchet-differentiable operator defined on an open subset D of a Banach space X with values in a Banach space Y. Finding roots of Eq. (1) is a classical problem arising in many areas of applied mathematics and engineering. In this study we are concerned with the problem of approximating a locally unique solution α of Eq. (1). Some of the well-known methods for this purpose are the f...
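As a one-dimensional illustration of the secant idea that this abstract treats in the general Banach-space setting, here is a minimal scalar secant iteration for f(x) = 0; the function name and stopping rule are illustrative choices, not taken from the paper.

```python
def secant_root(f, x0, x1, tol=1e-10, max_iter=100):
    """Scalar secant iteration for f(x) = 0: each step replaces the derivative
    in Newton's method by the divided difference (f(x1) - f(x0)) / (x1 - x0)."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:                     # secant line is horizontal; give up
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Example: approximate the cube root of 2.
root = secant_root(lambda x: x**3 - 2.0, 1.0, 2.0)
```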


Low complexity secant quasi-Newton minimization algorithms for nonconvex functions

In this work some interesting relations between results on basic optimization and algorithms for nonconvex functions (such as BFGS and secant methods) are pointed out. In particular, some innovative tools for improving our recent secant BFGS-type and LQN algorithms are described in detail. MSC: 51M04; 65H20; 65F30; 90C53


The modified BFGS method with new secant relation for unconstrained optimization problems

Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the second-order curvature of the objective function. Then, based on this modified secant relation, we present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, while the usual secant relation uses only gradient values. U...
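One well-known modified secant relation of the kind described, which incorporates function values in addition to gradients (shown here as a representative example from the literature, not necessarily the exact relation proposed in this paper), is:

```latex
B_{k+1}s_{k}=\tilde{y}_{k},\qquad
\tilde{y}_{k}=y_{k}+\frac{\vartheta_{k}}{s_{k}^{\top}s_{k}}\,s_{k},\qquad
\vartheta_{k}=2\bigl(f_{k}-f_{k+1}\bigr)+\bigl(g_{k}+g_{k+1}\bigr)^{\top}s_{k},
```

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The correction term comes from a Taylor expansion of f(x_k) about x_{k+1}, so that s_k^T B_{k+1} s_k reflects function-value information rather than gradient differences alone.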


Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization

Augmented Lagrangian methods for large-scale optimization usually require efficient algorithms for minimization with box constraints. On the other hand, active-set box-constraint methods employ unconstrained optimization algorithms for minimization inside the faces of the box. Several approaches may be employed for computing internal search directions in the large-scale case. In this paper a mi...



Journal

Journal title: Computational Optimization and Applications

Year: 2023

ISSN: 0926-6003, 1573-2894

DOI: https://doi.org/10.1007/s10589-022-00448-x